Online Boosting Algorithms for Anytime Transfer and Multitask Learning

Authors

  • Boyu Wang
  • Joelle Pineau
Abstract

The related problems of transfer learning and multitask learning have attracted significant attention, generating a rich literature of models and algorithms. Yet most existing approaches are studied in an offline fashion, implicitly assuming that data from different domains are given as a batch. Such an assumption is not valid in many real-world applications where data samples arrive sequentially and a good learner is needed even after only a few examples. The goal of our work is to provide sound extensions to existing transfer and multitask learning algorithms such that they can be used in an anytime setting. More specifically, we propose two novel online boosting algorithms, one for transfer learning and one for multitask learning, both designed to leverage the knowledge of instances in other domains. The experimental results show state-of-the-art empirical performance on standard benchmarks, and we present results of using our methods for effectively detecting new seizures in patients with epilepsy from very few previous samples.
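To make the online setting concrete, below is a minimal sketch (in Python) of an Oza-Russell style online boosting update: each incoming example is shown to every weak learner a Poisson-distributed number of times, and its weight is rescaled according to how well the learners so far handle it. This only illustrates online boosting in general; it is not the transfer or multitask algorithm proposed in the paper, and the class name, the binary {0, 1} labels, and the Gaussian naive Bayes weak learners are assumptions.

import numpy as np
from sklearn.naive_bayes import GaussianNB  # any weak learner with partial_fit works

class OzaBoostSketch:
    """Oza-Russell style online boosting for binary labels in {0, 1} (illustration only)."""

    def __init__(self, n_weak=10, classes=(0, 1), seed=0):
        self.weak = [GaussianNB() for _ in range(n_weak)]
        self.lam_correct = np.zeros(n_weak)  # cumulative weight of correctly handled examples
        self.lam_wrong = np.zeros(n_weak)    # cumulative weight of mistakes
        self.classes = np.array(classes)
        self.rng = np.random.default_rng(seed)

    def partial_fit(self, x, y):
        """Update all weak learners with one example (x: 1-D feature array, y: 0 or 1)."""
        lam = 1.0  # importance weight of the incoming example
        for m, h in enumerate(self.weak):
            for _ in range(self.rng.poisson(lam)):  # Poisson(lambda) copies of the example
                h.partial_fit(x.reshape(1, -1), [y], classes=self.classes)
            if hasattr(h, "classes_") and h.predict(x.reshape(1, -1))[0] == y:
                self.lam_correct[m] += lam
                lam *= (self.lam_correct[m] + self.lam_wrong[m]) / (2 * self.lam_correct[m])
            else:
                self.lam_wrong[m] += lam
                lam *= (self.lam_correct[m] + self.lam_wrong[m]) / (2 * max(self.lam_wrong[m], 1e-12))

    def predict(self, x):
        """Weighted vote of the weak learners trained so far."""
        votes = np.zeros(len(self.classes))
        for m, h in enumerate(self.weak):
            total = self.lam_correct[m] + self.lam_wrong[m]
            eps = self.lam_wrong[m] / max(total, 1e-12)
            if eps >= 0.5 or not hasattr(h, "classes_"):
                continue  # skip useless or untrained learners
            beta = max(eps / (1 - eps), 1e-12)
            pred = h.predict(x.reshape(1, -1))[0]
            votes[np.where(self.classes == pred)[0][0]] += np.log(1.0 / beta)
        return self.classes[np.argmax(votes)]

Because the ensemble is updated one example at a time, it can be queried after every update, which is the anytime behaviour the abstract refers to.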


Similar resources

Generalized Dictionary for Multitask Learning with Boosting

While multitask learning has been extensively studied, most existing methods rely on linear models (e.g. linear regression, logistic regression), which may fail in dealing with more general (nonlinear) problems. In this paper, we present a new approach that combines dictionary learning with gradient boosting to achieve multitask learning with general (nonlinear) basis functions. Specifically, f...

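The combination described in this abstract, a dictionary of nonlinear basis functions grown by gradient boosting and then weighted per task, could look roughly like the following. The helper name shared_boosted_dictionary, the pooled-residual fitting, the regression trees used as basis functions, and the ridge regression for task-specific weights are all illustrative assumptions, not the cited paper's actual method.

import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import Ridge

def shared_boosted_dictionary(tasks, n_basis=20, lr=0.1, depth=2):
    """tasks: list of (X, y) regression tasks sharing the same feature space."""
    X_all = np.vstack([X for X, _ in tasks])
    residuals = [y.astype(float).copy() for _, y in tasks]
    preds = [np.zeros(len(y)) for _, y in tasks]
    dictionary = []
    for _ in range(n_basis):
        # fit one shared nonlinear basis function to the pooled residuals of all tasks
        r_all = np.concatenate(residuals)
        tree = DecisionTreeRegressor(max_depth=depth).fit(X_all, r_all)
        dictionary.append(tree)
        for t, (X, y) in enumerate(tasks):
            preds[t] += lr * tree.predict(X)
            residuals[t] = y - preds[t]
    # each task gets its own linear combination of the shared basis functions
    coefs = []
    for X, y in tasks:
        Phi = np.column_stack([tree.predict(X) for tree in dictionary])
        coefs.append(Ridge(alpha=1.0).fit(Phi, y).coef_)
    return dictionary, coefs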

Algorithms and Applications for Multitask Learning

Multitask Learning is an inductive transfer method that improves generalization by using domain information implicit in the training signals of related tasks as an inductive bias. It does this by learning multiple tasks in parallel using a shared representation. Multitask transfer in connectionist nets has already been proven. But questions remain about how often training data for useful extra...

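One simple way to picture "learning multiple tasks in parallel using a shared representation" is a single network with one shared hidden layer and one output unit per task; scikit-learn's MLPRegressor accepts multi-output targets, so a few lines suffice. The synthetic data and the architecture below are hypothetical and serve only to illustrate hard parameter sharing.

import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical data: 200 examples, 5 features, 3 related regression tasks
# that share an underlying nonlinear signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
shared_signal = np.sin(X[:, 0]) + X[:, 1] ** 2
Y = np.column_stack([shared_signal + t * X[:, 2] + 0.1 * rng.normal(size=200)
                     for t in range(3)])

# One network, one shared hidden layer, three outputs: the hidden
# representation is learned jointly from the training signals of all tasks.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X, Y)
print(net.predict(X[:2]))  # one prediction per task for each example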

MULTIBOOST: A Multi-purpose Boosting Package

The MULTIBOOST package provides a fast C++ implementation of multi-class/multi-label/multitask boosting algorithms. It is based on ADABOOST.MH but it also implements popular cascade classifiers and FILTERBOOST. The package contains common multi-class base learners (stumps, trees, products, Haar filters). Further base learners and strong learners following the boosting paradigm can be easily imp...


Online and Adaptive Methods for Multitask Learning

The power of jointly learning multiple tasks arises from the transfer of relevant knowledge across those tasks, especially from information-rich tasks to information-poor ones. Lifelong learning, on the other hand, provides an efficient way to learn new tasks faster by utilizing the knowledge learned from previous tasks while preventing catastrophic forgetting or significantly degraded performan...


Memory Constraint Online Multitask Classification

We investigate online kernel algorithms which simultaneously process multiple classification tasks while a fixed constraint is imposed on the size of their active sets. We focus in particular on the design of algorithms that can efficiently deal with problems where the number of tasks is extremely high and the task data are large scale. Two new projection-based algorithms are introduced to effic...

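As a rough illustration of serving several tasks under a fixed memory budget, the sketch below keeps one shared list of stored examples with per-task coefficients and evicts the oldest stored example once the budget is exceeded. The class name, the RBF kernel, and the oldest-first eviction rule are assumptions; the projection-based strategy of the cited paper is more refined than this.

import numpy as np

class SharedBudgetKernelLearner:
    """Online kernel perceptrons for several binary tasks sharing one bounded active set."""

    def __init__(self, budget=50, gamma=1.0):
        self.budget = budget
        self.gamma = gamma
        self.active = []   # shared active set of stored examples
        self.alpha = {}    # task id -> coefficients, parallel to self.active

    def _k(self, x, z):
        # RBF kernel between two 1-D feature arrays
        return np.exp(-self.gamma * np.sum((np.asarray(x) - np.asarray(z)) ** 2))

    def decision(self, task, x):
        coef = self.alpha.get(task, [])
        return sum(a * self._k(x, z) for a, z in zip(coef, self.active))

    def partial_fit(self, task, x, y):
        """One online update for the given task (y in {-1, +1})."""
        if task not in self.alpha:
            self.alpha[task] = [0.0] * len(self.active)
        if y * self.decision(task, x) <= 0:      # mistake-driven perceptron step
            self.active.append(x)
            for t in self.alpha:
                self.alpha[t].append(float(y) if t == task else 0.0)
            if len(self.active) > self.budget:   # enforce the fixed memory constraint
                self.active.pop(0)
                for t in self.alpha:
                    self.alpha[t].pop(0)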


Journal:

Volume   Issue

Pages  -

Publication date  2015